Return-Path: <news@newsmaster.cc.columbia.edu>
Received: from newsmaster.cc.columbia.edu (newsmaster.cc.columbia.edu [128.59.35.30])
by watsun.cc.columbia.edu (8.8.5/8.8.5) with ESMTP id OAA01179
for <kermit.misc@watsun.cc.columbia.edu>; Tue, 17 Feb 1998 14:53:12 -0500 (EST)
Received: (from news@localhost)
by newsmaster.cc.columbia.edu (8.8.5/8.8.5) id OAA25865
for kermit.misc@watsun; Tue, 17 Feb 1998 14:53:11 -0500 (EST)
Path: news.columbia.edu!panix!nntprelay.mathworks.com!newsfeed.direct.ca!newshub1.home.com!news.home.com!news.rdc1.sfba.home.net!cypher.cagent.com!user
From: tsw@cagent.com (Tom Watson)
Newsgroups: comp.protocols.kermit.misc
Subject: Re: k95 Setting Parity Bit?
Date: Tue, 17 Feb 1998 11:13:04 -0800
Organization: CagEnt, Inc.
Lines: 65
Message-ID: <tsw-1702981113040001@cypher.cagent.com>
References: <6c8b9l$84r$1@goanna.cs.rmit.edu.au> <6c9p95$cup$1@apakabar.cc.columbia.edu>
NNTP-Posting-Host: alfred.cagent.com
Cache-Post-Path: alfred.cagent.com!unknown@cypher.cagent.com
Xref: news.columbia.edu comp.protocols.kermit.misc:8430
In article <6c9p95$cup$1@apakabar.cc.columbia.edu>,
fdc@watsun.cc.columbia.edu (Frank da Cruz) wrote:
<<<deletia regarding how to set stop bits>>>
> There is presently no way to set stop bits in K95. To our knowledge, the
> last device to need any number of stop bits other than 1 was the Teletype
> machine, circa 1929, to give its big clunky print head time to turn around
> to home position between characters (these were still in use through about
> the mid 1970s, and I'm sure some are still in service today, but I don't
> think you'd be using K95 to communicate with one) (*).
>
<<<deletia>>>
>
> (*) Some Telecommunications Devices for the Deaf (TDDs) might still use
> Teletypes, but K95 could not communicate with them anyway, since they
> use 5-bit Baudot code rather than ASCII. Most modern TDDs use ASCII.
Some history on stop bits:
It's true that 5-level (Baudot) teletypes used a different stop bit length:
typically 1.5 times the normal bit time, and ONLY on 5-level machines. On
the other hand, the "modern" Teletype Model 33 and Model 35, which both ran
at 110 bps, used TWO stop bits. These were very popular until at least the
mid to late 70's.
General rules (there are probably exceptions):
  5-level (Baudot) machines:                 1.5 stop bits
  110 bps ASCII machines (TTY 33, TTY 35):   2 stop bits
  All other machines:                        1 stop bit
Actually the "stop bit" can be longer than the above, as it is the 'idle'
state of the line. The above are the minimum allowed.
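The rules above can be sketched as a small lookup (a hypothetical helper for
illustration, not any real library's API):

```python
# Minimum stop-bit length per the classic conventions described above:
# 5-level (Baudot) machines get 1.5, 110 bps ASCII machines get 2,
# everything else gets 1. Longer is always allowed, since the stop
# "bit" is just the idle (mark) state of the line.

def min_stop_bits(data_bits, bit_rate=None):
    if data_bits == 5:
        return 1.5            # Baudot machines
    if bit_rate == 110:
        return 2              # TTY 33 / TTY 35 class machines
    return 1                  # everything else

print(min_stop_bits(5))          # Baudot -> 1.5
print(min_stop_bits(7, 110))     # TTY 33/35 -> 2
print(min_stop_bits(8, 9600))    # modern gear -> 1
```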
Early UART chips allowed for different character lengths: 5, 6, 7, or 8
data bits per character. Another pin selected "1" or "other than 1" stop
bit(s); with 5 data bits, "other than 1" meant 1.5 stop bits, otherwise it
meant 2. Parity was "in addition to" the number of data bits.
This gives us the current scheme of 7E1, and 8N1, both having 8 bits
between the start and stop bits.
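The frame arithmetic behind that point can be spelled out in a few lines
(a sketch, counting 1 start bit, the data bits, an optional parity bit,
and the stop bit):

```python
# Total bits on the wire per character for a given serial format.
# 7E1 and 8N1 both put 8 bits between the start and stop bits,
# so both come to 10 bits per character overall.

def frame_bits(data_bits, parity, stop_bits):
    parity_bits = 0 if parity == "N" else 1   # "E"/"O" add one bit
    return 1 + data_bits + parity_bits + stop_bits

print(frame_bits(7, "E", 1))   # 7E1 -> 10 bits per character
print(frame_bits(8, "N", 1))   # 8N1 -> 10 bits per character
```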
Other schemes were 6 bit characters with parity (total of 7 bits) used for
IBM 2741 terminals. These beasts had all sorts of weird protocols to turn
around the line, as they were half duplex. Since they ran at 15
characters/sec, the bit rate was 134.5 bps to get the timing "just
right". These relics still have a legacy in the fact that the bit rate of
134.5 can usually be set on many UARTs today.
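The 2741 numbers roughly check out. Assuming a 9-bit frame (1 start + 6
data + 1 parity + 1 stop; the exact layout here is an assumption for
illustration), 134.5 bps divided by 9 bits lands right around 15
characters per second:

```python
# Rough arithmetic behind the 2741 rate quoted above. The frame layout
# (1 start + 6 data + 1 parity + 1 stop = 9 bits) is assumed for
# illustration; the point is that 134.5 bps / 9 bits is about 15 cps.

frame = 1 + 6 + 1 + 1            # start + data + parity + stop
print(round(134.5 / frame, 2))   # ~14.94 chars/sec, i.e. about 15
```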
The only thing that having TWO stop bits adds is a little extra delay
between characters, but that can lead to problems. In one case, I had a
machine that (by default) used two stop bits, but was sent data with one
stop bit. On the receiving end this doesn't matter (stop bits only matter
on the sending side). The problem arose when the remote machine (with
two stop bits) was echoing characters. It had only a one-character
buffer, and soon (about 1/2 way through a line of text) it couldn't keep
up. This caused a lost character in the receive stream. Turning off the
echo, or changing to one stop bit, cured the problem.
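A toy simulation shows the failure mode (the specific numbers here are
assumptions for illustration, not the actual machines from the story):
the sender uses 1 stop bit (10 bits/char), the echoing side uses 2
(11 bits/char), so the echo falls one bit-time further behind on every
character until its one-character buffer overflows.

```python
# Toy model: echo side needs 11 bit-times per character but receives a
# new character every 10 bit-times, so its lag grows by 1 bit-time per
# character. With a one-character (10 bit-time) buffer, it overflows
# partway through an 80-column line.

SEND_BITS = 10    # sender: 1 start + 8 data + 1 stop
ECHO_BITS = 11    # echoer: 1 start + 8 data + 2 stop
BUFFER_CHARS = 1  # one-character echo buffer

lag_bits = 0
for char_number in range(1, 81):           # one 80-column line
    lag_bits += ECHO_BITS - SEND_BITS      # echo falls further behind
    if lag_bits > BUFFER_CHARS * SEND_BITS:
        print("echo buffer overflows at character", char_number)
        break
```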
Moral: one stop bit is enough.
--
tsw@cagent.com (Home: tsw@johana.com)
Please forward spam to: annagram@hr.house.gov (my Congressman), I do.